Results 1 - 16 of 16
1.
Curr Protoc ; 4(5): e1043, 2024 May.
Article in English | MEDLINE | ID: mdl-38706422

ABSTRACT

Trypanosoma brucei (Tb) is the causative agent of human African trypanosomiasis (HAT), also known as sleeping sickness, which can be fatal if left untreated. An understanding of the parasite's cellular metabolism is vital for the discovery of new antitrypanosomal drugs and for disease eradication. Metabolomics can be used to analyze the numerous metabolic pathways described as essential to Tb, but it has some limitations linked to the metabolites' physicochemical properties and the extraction process. To develop an optimized method for extracting and analyzing Tb metabolites, we tested the three most commonly used extraction methods, analyzed the extracts by hydrophilic interaction liquid chromatography high-resolution mass spectrometry (HILIC LC-HRMS), and further evaluated the results using quantitative criteria, including the number, intensity, reproducibility, and variability of features, as well as qualitative criteria such as the specific coverage of relevant metabolites. Here, we present the resulting protocols for untargeted metabolomic analysis of Tb using HILIC LC-HRMS. © 2024 Wiley Periodicals LLC. Basic Protocol 1: Culture of Trypanosoma brucei brucei parasites. Basic Protocol 2: Preparation of samples for metabolomic analysis of Trypanosoma brucei brucei. Basic Protocol 3: LC-HRMS-based metabolomic data analysis of Trypanosoma brucei brucei.


Subject(s)
Metabolomics, Trypanosoma brucei brucei, Trypanosoma brucei brucei/metabolism, Metabolomics/methods, Liquid Chromatography/methods, Mass Spectrometry/methods, Trypanosomiasis, African/parasitology
2.
Biochem Biophys Res Commun ; 703: 149684, 2024 Apr 09.
Article in English | MEDLINE | ID: mdl-38367514

ABSTRACT

Malaria is a parasitic disease that remains a global concern and the subject of many studies. Metabolomics has emerged as an approach to better comprehend complex pathogens and discover possible drug targets, giving new insights that can aid the development of antimalarial therapies. However, there is no standardized method to extract metabolites from in vitro Plasmodium falciparum intraerythrocytic parasites, the stage that causes malaria. Additionally, most methods are developed with either LC-MS or NMR analysis in mind, and have rarely been evaluated with both tools. In this work, three extraction methods frequently found in the literature were reproduced, and the samples were analyzed by both LC-MS and 1H NMR and evaluated to determine which method is the most repeatable and consistent, using an array of different tools including chemometrics, peak detection, and annotation. The most reliable method in this study proved to be a double extraction with methanol and methanol/water (80:20, v/v). Metabolomic studies in the field should move towards standardized methodologies and the use of both LC-MS and 1H NMR, to make data more comparable between studies and facilitate the recovery of biologically interpretable information.


Subject(s)
Antimalarials, Malaria, Humans, Plasmodium falciparum/metabolism, Liquid Chromatography-Mass Spectrometry, Liquid Chromatography/methods, Proton Magnetic Resonance Spectroscopy, Methanol/metabolism, Tandem Mass Spectrometry/methods, Metabolomics/methods
3.
Metabolomics ; 20(2): 25, 2024 Feb 23.
Article in English | MEDLINE | ID: mdl-38393408

ABSTRACT

INTRODUCTION: Human African trypanosomiasis, commonly known as sleeping sickness, is a vector-borne parasitic disease prevalent in sub-Saharan Africa and transmitted by the tsetse fly. Suramin, a medication with a long history of clinical use, has demonstrated varied modes of action against Trypanosoma brucei. This study employs a comprehensive workflow to investigate the metabolic effects of suramin on T. brucei, utilizing a multimodal metabolomics approach. OBJECTIVES: The primary aim of this study is to comprehensively analyze the metabolic impact of suramin on T. brucei using a combined liquid chromatography-mass spectrometry (LC-MS) and nuclear magnetic resonance spectroscopy (NMR) approach. Statistical analyses, encompassing multivariate analysis and pathway enrichment analysis, are applied to elucidate significant variations and metabolic changes resulting from suramin treatment. METHODS: A detailed methodology involving the integration of high-resolution data from LC-MS and NMR techniques is presented. The study conducts a thorough analysis of metabolite profiles in both suramin-treated and control T. brucei brucei samples. Statistical techniques, including ANOVA-simultaneous component analysis (ASCA), principal component analysis (PCA), ANOVA 2 analysis, and bootstrap tests, are employed to discern the effects of suramin treatment on the metabolomics outcomes. RESULTS: Our investigation reveals substantial differences in metabolic profiles between the control and suramin-treated groups. ASCA and PCA analysis confirm distinct separation between these groups in both MS-negative and NMR analyses. Furthermore, ANOVA 2 analysis and bootstrap tests confirmed the significance of treatment, time, and interaction effects on the metabolomics outcomes. 
Functional analysis of the LC-MS data highlighted the impact of treatment on amino acid metabolism and on amino sugar and nucleotide sugar metabolism, while time effects were observed on carbon intermediary metabolism (notably glycolysis, the di- and tricarboxylic acids of the succinate production pathway, and the tricarboxylic acid (TCA) cycle). CONCLUSION: Through the integration of LC-MS and NMR techniques coupled with advanced statistical analyses, this study identifies distinctive metabolic signatures and pathways associated with suramin treatment in T. brucei. These findings contribute to a deeper understanding of the pharmacological impact of suramin and have the potential to inform the development of more efficacious therapeutic strategies against African trypanosomiasis.
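As an aside for readers unfamiliar with the multivariate step described above, the group separation that PCA can reveal between treated and control metabolic profiles is easy to sketch on synthetic data. This is a generic illustration, not the study's data or code; the sample sizes, feature count, and effect size below are invented:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Synthetic example: 30 control + 30 treated samples, 20 metabolite features;
# "treatment" shifts the first 5 features (stand-ins for affected metabolites).
control = rng.normal(0.0, 1.0, size=(30, 20))
treated = rng.normal(0.0, 1.0, size=(30, 20))
treated[:, :5] += 3.0

X = np.vstack([control, treated])
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))

# With a real treatment effect, the groups separate along the first component.
gap = scores[30:, 0].mean() - scores[:30, 0].mean()
```

In real metabolomics workflows the same scores plot is typically inspected visually, and supervised follow-ups (e.g. ASCA, as in the study) quantify the treatment and time effects.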


Subject(s)
Trypanosoma brucei brucei, Trypanosomiasis, African, Animals, Humans, Suramin/pharmacology, Suramin/metabolism, Suramin/therapeutic use, Trypanosomiasis, African/drug therapy, Trypanosomiasis, African/parasitology, Metabolomics/methods, Trypanosoma brucei brucei/metabolism, Workflow
4.
J Mol Med (Berl) ; 98(12): 1737-1751, 2020 12.
Article in English | MEDLINE | ID: mdl-33079232

ABSTRACT

Neovascular age-related macular degeneration (nAMD) is the leading cause of blindness in aging populations. Here, we applied metabolomics to human sera of patients with nAMD during an active (exudative) phase of the pathology and found higher lactate levels and a shift in the lipoprotein profile (increased VLDL-LDL/HDL ratio). Similar metabolomic changes were detected in the sera of mice subjected to laser-induced choroidal neovascularization (CNV). In this experimental model, we provide evidence for two sites of lactate production: a local one in the injured eye, and a systemic one associated with the recruitment of bone marrow-derived inflammatory cells. Mechanistically, lactate promotes the angiogenic response and M2-like macrophage accumulation in the eyes. The therapeutic potential of our findings is demonstrated by the pharmacological control of lactate levels through pyruvate dehydrogenase kinase (PDK) inhibition by dichloroacetic acid (DCA). Mice treated with DCA exhibited normalized lactate levels and lipoprotein profiles, and CNV formation was inhibited. Collectively, our findings implicate the PDK/lactate axis in AMD pathogenesis and reveal that regulating PDK activity has potential therapeutic value in this ocular disease. The results also indicate that the lipoprotein profile is a traceable pattern worth considering for patient follow-up. KEY MESSAGES: Lactate and the lipoprotein profile are associated with the active phase of AMD and CNV development. Lactate is a relevant, functional metabolite correlated with AMD progression. Modulating lactate through pyruvate dehydrogenase kinase decreased CNV progression. Pyruvate dehydrogenase kinase is a new therapeutic target for neovascular AMD.


Subject(s)
Lactic Acid/metabolism, Metabolic Networks and Pathways, Pyruvate Dehydrogenase Acetyl-Transferring Kinase/metabolism, Signal Transduction, Angiogenesis Inhibitors/pharmacology, Angiogenesis Inhibitors/therapeutic use, Biomarkers, Choroidal Neovascularization/etiology, Choroidal Neovascularization/metabolism, Choroidal Neovascularization/pathology, Disease Management, Humans, Macular Degeneration/drug therapy, Macular Degeneration/etiology, Macular Degeneration/metabolism, Macular Degeneration/pathology, Metabolic Networks and Pathways/drug effects, Metabolome, Metabolomics/methods, Molecular Targeted Therapy, Protein Kinase Inhibitors/pharmacology, Protein Kinase Inhibitors/therapeutic use, Pyruvate Dehydrogenase Acetyl-Transferring Kinase/antagonists & inhibitors, Signal Transduction/drug effects
5.
Metabolomics ; 16(4): 42, 2020 03 18.
Article in English | MEDLINE | ID: mdl-32189152

ABSTRACT

INTRODUCTION: The use of 2D NMR data sources (COSY in this paper) makes it possible to reach general metabolomics results that are at least as good as those obtained with 1D NMR data, and with a less advanced and less complex level of pre-processing. However, a major issue remains and can substantially slow down the generalized use of 2D data sources in metabolomics: the experiment duration. OBJECTIVE: The goal of this paper is to overcome the experiment-duration issue in our recently published MIC strategy by considering faster 2D COSY acquisition techniques: a conventional COSY with a reduced number of transients, and the Non-Uniform Sampling (NUS) method. These faster alternatives are all subjected to novel 2D pre-processing workflows and to Metabolomic Informative Content analyses. The results are then compared to those obtained with conventional COSY spectra. METHODS: To pre-process the 2D data sources, the Global Peak List (GPL) workflow and the Vectorization workflow are used. To compare these data sources and to detect the most informative one(s), MIC (Metabolomic Informative Content) indexes are used, based on clustering and inertia measures of quality. RESULTS: Results are discussed according to a multi-factor experimental design (unsupervised and based on human urine samples). Descriptive PCA results and MIC indexes are shown, leading to a direct and objective comparison of the different data sets. CONCLUSION: In conclusion, it is demonstrated that conventional COSY spectra recorded with only one transient per increment, and COSY spectra recorded with 50% non-uniform sampling, provide MIC results very similar to those of the initial COSY recorded with four transients, but in a much shorter time. Consequently, techniques such as reducing the number of transients or NUS can open the door to a potential high-throughput use of 2D COSY spectra in metabolomics.


Subject(s)
Metabolomics/methods, Workflow, Algorithms, Humans, Magnetic Resonance Spectroscopy, Principal Component Analysis
6.
Metabolomics ; 15(4): 63, 2019 04 16.
Article in English | MEDLINE | ID: mdl-30993405

ABSTRACT

INTRODUCTION: The pre-processing of analytical data in metabolomics must be considered as a whole to allow the construction of a global and unique object for any further simultaneous data analysis or multivariate statistical modelling. For 1D 1H-NMR metabolomics experiments, best practices for data pre-processing are well defined, but this is not yet the case for 2D experiments (for instance COSY in this paper). OBJECTIVE: By considering the added value of a second dimension, the objective is to propose two workflows dedicated to 2D NMR data handling and preparation (the Global Peak List and Vectorization approaches) and to compare them (with respect to each other and to 1D standards). This allows us to detect which methodology is best in terms of metabolomic content and to explore the advantages of the selected workflow in distinguishing among treatment groups and identifying relevant biomarkers. This paper therefore explores the need for novel 2D pre-processing workflows, the evaluation of their quality, and the evaluation of their performance in the subsequent determination of accurate (2D) biomarkers. METHODS: To select the most informative data source, MIC (Metabolomic Informative Content) indexes are used, based on clustering and inertia measures of quality. Then, to highlight biomarkers or critical spectral zones, the PLS-DA model is used, along with more advanced sparse algorithms (sPLS and L-sOPLS). RESULTS: Results are discussed according to two different experimental designs (one unsupervised and based on human urine samples, the other controlled and based on spiked serum media). MIC indexes are shown, leading to the choice of the most relevant workflow to use thereafter. Finally, biomarkers are provided for each case and the predictive power of each candidate model is assessed with cross-validated measures of RMSEP. CONCLUSION: In conclusion, it is shown that no solution is universally the best in every case, but that 2D experiments make it possible to clearly identify relevant cross-peak biomarkers even with poor initial separability between groups. The MIC measures linked to the candidate workflows (2D GPL, 2D vectorization, 1D, each with specific parameters) show which data set should be used as a priority to more easily find biomarkers. The diversity of data sources, mainly 1D versus 2D, may often lead to complementary or confirmatory results.
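The idea of ranking data sources by clustering-based informative content can be illustrated generically. The sketch below uses the silhouette score merely as a stand-in for the paper's MIC indexes (which are defined differently), on invented synthetic "spectra": a data source that separates the known groups better scores higher.

```python
import numpy as np
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(4)
labels = np.repeat([0, 1], 15)  # two known groups of 15 samples

def simulate(separation):
    # 30 synthetic "spectra" with 40 features; `separation` shifts group 1
    # on the first 10 features, mimicking a more/less informative data source.
    X = rng.normal(0.0, 1.0, size=(30, 40))
    X[15:, :10] += separation
    return X

# A more informative data source yields better-separated clusters,
# hence a higher cluster-quality score.
score_weak = silhouette_score(simulate(0.5), labels)
score_strong = silhouette_score(simulate(3.0), labels)
```

The comparison logic mirrors the paper's: compute the same quality index on each candidate data set (1D, 2D GPL, 2D vectorized) and prioritize the one with the highest informative content.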


Subject(s)
Computational Biology/methods, Magnetic Resonance Spectroscopy/methods, Metabolomics/methods, Algorithms, Biomarkers, Data Analysis, Magnetic Resonance Imaging/methods, Software, Workflow
7.
Anal Chim Acta ; 1019: 1-13, 2018 Aug 17.
Article in English | MEDLINE | ID: mdl-29625674

ABSTRACT

In the analysis of biological samples, control over experimental design and data acquisition procedures alone cannot ensure well-conditioned 1H NMR spectra with maximal information recovery for data analysis. A third major element affects the accuracy and robustness of results: the data pre-processing/pre-treatment, to which not enough attention is usually devoted, in particular in metabolomic studies. The usual approach is to conduct the entire pre-processing strategy with the proprietary software provided by the analytical instruments' manufacturers. This widespread practice has a number of advantages, such as a user-friendly interface with graphical facilities, but it involves non-negligible drawbacks: a lack of methodological information and automation, a dependency on subjective human choices, only standard processing possibilities, and an absence of objective quality criteria to evaluate pre-processing quality. This paper introduces PepsNMR, an R package dedicated to the whole processing chain prior to multivariate data analysis, to meet these needs; it includes, among other tools, solvent signal suppression, internal calibration, phase, baseline and misalignment corrections, bucketing and normalisation. Methodological aspects are discussed and the package is compared to the gold-standard procedure with two metabolomic case studies. The use of PepsNMR on these data shows better information recovery and predictive power based on objective and quantitative quality criteria. Other key assets of the package are workflow processing speed, reproducibility, reporting and flexibility, graphical outputs and documented routines.


Subject(s)
Metabolomics, Proton Magnetic Resonance Spectroscopy, Software
8.
Stat Med ; 35(14): 2328-58, 2016 06 30.
Article in English | MEDLINE | ID: mdl-26822948

ABSTRACT

Two main methodologies for assessing equivalence in method-comparison studies are presented separately in the literature. The first is the well-known and widely applied Bland-Altman approach with its agreement intervals, where two methods are considered interchangeable if their differences are not clinically significant. The second is based on errors-in-variables regression in a classical (X,Y) plot and focuses on confidence intervals, whereby two methods are considered equivalent when they provide similar measures notwithstanding the random measurement errors. This paper reconciles these two methodologies and shows their similarities and differences using both real data and simulations. A new consistent correlated-errors-in-variables regression is introduced, as the errors are shown to be correlated in the Bland-Altman plot. Indeed, the coverage probabilities collapse and the biases soar when this correlation is ignored. Novel tolerance intervals are compared with agreement intervals with or without replicated data, and novel predictive intervals are introduced to predict a single measure in an (X,Y) plot or in a Bland-Altman plot with excellent coverage probabilities. We conclude that (correlated-)errors-in-variables regressions should not be avoided in method-comparison studies, although the Bland-Altman approach is usually applied to avert their complexity. We argue that tolerance or predictive intervals are better alternatives than agreement intervals, and we provide guidelines for practitioners regarding method-comparison studies. Copyright © 2016 John Wiley & Sons, Ltd.
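For reference, the Bland-Altman agreement interval discussed above is conventionally the mean of the paired differences ± 1.96 standard deviations of those differences. A minimal sketch on synthetic paired measurements (all values invented, not from the paper's data):

```python
import numpy as np

rng = np.random.default_rng(1)
# Two hypothetical measurement methods applied to the same 50 samples.
true_vals = rng.uniform(80.0, 120.0, size=50)
method_a = true_vals + rng.normal(0.0, 2.0, size=50)
method_b = true_vals + 1.0 + rng.normal(0.0, 2.0, size=50)  # small fixed bias

diff = method_a - method_b
bias = diff.mean()
# 95% limits of agreement: bias +/- 1.96 * SD of the differences.
sd = diff.std(ddof=1)
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd
```

The paper's point is precisely that such agreement intervals describe only the spread of differences; tolerance or predictive intervals carry stronger coverage guarantees for future individual measures.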


Subject(s)
Models, Statistical, Bias, Biostatistics, Blood Pressure Determination/statistics & numerical data, Computer Simulation, Confidence Intervals, Equivalence Trials as Topic, Humans, Probability, Regression Analysis
9.
Anal Chim Acta ; 705(1-2): 193-206, 2011 Oct 31.
Article in English | MEDLINE | ID: mdl-21962362

ABSTRACT

Method validation is mandatory in order to assess the fitness for purpose of a newly developed analytical method. Of core importance at the end of the validation is the evaluation of the reliability of the individual results that will be generated during the routine application of the method. Regulatory guidelines provide a general framework to assess the validity of a method, but none address the issue of results reliability. In this study, a Bayesian approach is proposed to address this concern. Results reliability is defined here as "the probability (π) of an analytical method to provide analytical results (X) within predefined acceptance limits (±λ) around their reference or conventional true concentration values (µ(T)) over a defined concentration range and under given environmental and operating conditions." Given the minimum reliability probability (π(min)) needed for the subsequent routine application of the method, as well as specifications or acceptance limits (±λ), the proposed Bayesian approach provides the effective probability of obtaining reliable future analytical results over the whole concentration range investigated. This is summarised in a single graph: the reliability profile. This Bayesian reliability profile is also compared to two frequentist approaches, the first derived from the work of Dewé et al. [W. Dewé, B. Govaerts, B. Boulanger, E. Rozet, P. Chiap, Ph. Hubert, Chemometr. Intell. Lab. Syst. 85 (2007) 262-268] and the second proposed by Govaerts et al. [B. Govaerts, W. Dewé, M. Maumy, B. Boulanger, Qual. Reliab. Eng. Int. 24 (2008) 667-680]. Furthermore, to illustrate its applicability, the Bayesian reliability profile is applied here to a bioanalytical method dedicated to the determination of ketoglutaric acid (KG) and hydroxymethylfurfural (HMF) in human plasma by SPE-HPLC-UV.


Subject(s)
Chromatography, High Pressure Liquid/methods, Reproducibility of Results, Solid Phase Extraction/methods, Bayes Theorem, Computer Simulation, Furaldehyde/analogs & derivatives, Furaldehyde/blood, Humans, Ketoglutaric Acids/blood, Models, Statistical, Probability, Spectrophotometry, Ultraviolet/methods
10.
BMC Bioinformatics ; 12: 413, 2011 Oct 25.
Article in English | MEDLINE | ID: mdl-22026942

ABSTRACT

BACKGROUND: The standard approach for preprocessing spotted microarray data is to subtract the local background intensity from the spot foreground intensity, to perform a log2 transformation, and to normalize the data with a global median or a lowess normalization. Although well motivated, the standard approaches for background correction and for transformation have been widely criticized because they produce high variance at low intensities. Although various alternatives to the standard background correction methods and to the log2 transformation have been proposed, the impacts of these two successive preprocessing steps have not been compared in an objective way. RESULTS: In this study, we assessed the impact of eight preprocessing methods combining four background correction methods and two transformations (the log2 and the glog), using data from the MAQC study. The results indicate that most preprocessing methods produce fold-change compression at low intensities. Fold-change compression was minimized using the Standard and the Edwards background correction methods coupled with a log2 transformation. The drawback of both methods is a high variance at low intensities, which consequently produced poor estimations of the p-values. On the other hand, effective stabilization of the variance as well as better estimations of the p-values were observed after the glog transformation. CONCLUSION: As both fold-change magnitudes and p-values are important in the context of microarray class-comparison studies, we recommend combining the Edwards correction with a hybrid transformation method that uses the log2 transformation to estimate fold-change magnitudes and the glog transformation to estimate p-values.
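The glog (generalised log) transformation discussed above is commonly written as glog(y) = log2((y + sqrt(y² + c²)) / 2), which approaches log2(y) at high intensities while remaining finite near zero, stabilizing the variance there. A minimal sketch; the tuning constant c is an arbitrary placeholder, not a value from the study:

```python
import numpy as np

def glog(y, c=50.0):
    # Generalised log: behaves like log2(y) for y >> c, stays finite near zero
    # (unlike log2, which diverges to -inf as y -> 0).
    return np.log2((y + np.sqrt(y**2 + c**2)) / 2.0)

# Low intensities are compressed smoothly instead of exploding in variance.
intensities = np.array([0.0, 10.0, 100.0, 10000.0])
g = glog(intensities)
```

In practice c is estimated from the data (e.g. from the low-intensity noise level), which is why the study treats the choice of transformation as a preprocessing step to be evaluated, not a fixed formula.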


Subject(s)
Gene Expression Profiling/methods, Oligonucleotide Array Sequence Analysis/methods, Humans, Quality Control
11.
Mycorrhiza ; 21(5): 443-449, 2011 Jul.
Article in English | MEDLINE | ID: mdl-21553021

ABSTRACT

The vegetative compatibility of the arbuscular mycorrhizal fungus (AMF) Glomus clarum MUCL 46238 was evaluated after continuous exposure to fenhexamid, a sterol biosynthesis inhibitor (SBI). Three lineages of this AMF were cultured in vitro for five generations in association with Ri T-DNA transformed carrot roots in the presence of 0, 5 or 10 mg l(-1) of fenhexamid. Whatever the AMF generation, fenhexamid at 5 and 10 mg l(-1) had no significant impact on the number of spores produced. However, vegetative compatibility tests (VCT) conducted with spores from the three lineages revealed that 10 mg l(-1) of fenhexamid impacted the anastomosis process. At this concentration, the morphology of the germ tubes was modified. In addition, nitrotetrazolium-trypan blue staining revealed that 10 mg l(-1) of fenhexamid significantly reduced the probability of fusion between the germ tubes regardless of the culture conditions (i.e. absence or presence of fenhexamid) preceding the VCT. Our results demonstrated that spore production was not affected by fenhexamid, while anastomosis between germ tubes was decreased. This suggests that high concentrations, accumulation or repeated application of this SBI fungicide may impact the community structure of AMF in soil.


Subject(s)
Amides/pharmacology, Glomeromycota/drug effects, Glomeromycota/growth & development, Sterols/antagonists & inhibitors, Sterols/biosynthesis, Fungicides, Industrial/pharmacology, Glomeromycota/metabolism, Spores, Fungal/drug effects, Spores, Fungal/growth & development, Spores, Fungal/metabolism
12.
Noise Health ; 13(50): 64-70, 2011.
Article in English | MEDLINE | ID: mdl-21173489

ABSTRACT

The armed forces are highly exposed to occupational noise. The aim of this study was to evaluate the prevalence of hearing loss (HL) and the noise exposures associated with its severity in a Belgian military population. A cross-sectional study was carried out at the Centre for Medical Expertise (CME) and in four Units of Occupational Medicine (UOM). Hearing thresholds were determined by audiometry. The examination included a questionnaire on hearing-related medical history and on noise exposure in military and leisure-time activities. A multinomial logistic regression model was used to assess the association of the severity of HL with tinnitus, the military occupation, and noise exposures. Of the 2055 subjects aged 18-55 years, 661 (32.2%) had a slight HL (25-40 dB), 280 (13.6%) had a moderate HL (45-60 dB) and 206 (10.0%) had a severe HL (> 60 dB) at 4 and 6 kHz for both ears. The prevalence of slight, moderate and severe HL increased significantly with age and was higher for subjects from Paracommando and infantry units. Fighting in Built-Up Area (FIBUA) training, shooting with large-caliber weapons, and participation in military exercises were the best determinants of HL in this population. These results suggest that subjects from infantry and Paracommando units run the highest risk of HL because they are exposed to very loud noises in their professional life, such as large-caliber shooting and FIBUA training.


Subject(s)
Hearing Loss, Noise-Induced/epidemiology, Military Personnel/statistics & numerical data, Noise, Occupational/adverse effects, Adolescent, Adult, Belgium/epidemiology, Cross-Sectional Studies, Female, Humans, Male, Middle Aged, Noise, Occupational/statistics & numerical data, Occupational Exposure/adverse effects, Occupational Exposure/statistics & numerical data, Odds Ratio, Prevalence, Young Adult
13.
Chemistry ; 15(25): 6267-78, 2009 Jun 15.
Article in English | MEDLINE | ID: mdl-19437473

ABSTRACT

A library of catalysts was designed for asymmetric hydrogen transfer to acetophenone. First, the whole library was evaluated using high-throughput experiments (HTE). The catalysts were ranked in ascending order with respect to their performance, and the best catalysts were identified. In the second step, various simulated-evolution experiments, based on a genetic algorithm, were applied to this library. A small part of the library, called the mother generation (G0), thus evolved from generation to generation. The goal was to use our collection of HTE data to adjust the parameters of the genetic algorithm in order to obtain a maximum of the best catalysts within a minimal number of generations. Notably, it was found that the simulated-evolution results depended on the selection of G0 and that a random G0 should be preferred. We also demonstrated that it was possible to find 5 to 6 of the ten best catalysts while investigating only 10% of the library. Moreover, we developed a double algorithm that makes this result achievable even if the evolution starts with one of the worst G0.
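The generation-to-generation loop described above can be sketched with a generic genetic algorithm: rank by fitness, keep the best, recombine and mutate. The encoding and fitness function below are invented placeholders, not the paper's catalyst descriptors or HTE scores:

```python
import random

random.seed(0)

def fitness(candidate):
    # Placeholder score (maximal at all-0.5); the paper's catalysts would
    # instead be scored by their measured HTE performance.
    return -sum((x - 0.5) ** 2 for x in candidate)

def evolve(pop, generations=30, mut=0.1):
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: len(pop) // 2]  # elitism: keep the best half unchanged
        children = []
        for a, b in zip(parents, reversed(parents)):
            child = [random.choice(pair) for pair in zip(a, b)]      # crossover
            children.append([x + random.gauss(0.0, mut) for x in child])  # mutation
        pop = parents + children
    return max(pop, key=fitness)

# Mother generation G0: 20 random candidates with 4 "descriptor" genes each.
g0 = [[random.random() for _ in range(4)] for _ in range(20)]
best = evolve(g0)
```

Because the best half is carried over unchanged (elitism), the top fitness never decreases across generations, which mirrors the paper's aim of reaching the best catalysts while evaluating only a fraction of the library.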


Subject(s)
Acetophenones/chemistry, Algorithms, Hydrogenation, Biological Evolution, Catalysis, Databases, Genetic, Small Molecule Libraries
14.
Talanta ; 79(1): 77-85, 2009 Jun 30.
Article in English | MEDLINE | ID: mdl-19376347

ABSTRACT

One of the major issues in the fully automated development of chromatographic methods is the automated detection and identification of peaks coming from complex samples, such as multi-component pharmaceutical formulations or stability studies of these formulations. The same problem can also occur with plant materials or biological matrices. This step is thus critical and time-consuming, especially when a Design of Experiments (DOE) approach is used to generate chromatograms. DOE will often maximize the changes in the analytical conditions in order to explore an experimental domain. Unfortunately, this generally provides very different and "unpredictable" chromatograms which can be difficult to interpret, complicating peak detection and peak tracking (i.e. matching peaks among all the chromatograms). In this context, Independent Component Analysis (ICA), a new statistically based signal processing method, was investigated to solve this problem. The ICA principle assumes that the observed signal is the resultant of several phenomena (known as sources) and that all these sources are statistically independent. Under those assumptions, ICA is able to recover the sources, which have a high probability of representing the constitutive components of a chromatogram. In the present study, ICA was successfully applied for the first time to HPLC-UV-DAD chromatograms, and it was shown that ICA allows differentiation of noise and artifact components from those of interest by applying clustering methods based on high-order statistics computed on these components. Furthermore, on the basis of the described numerical strategy, it was also possible to reconstruct a cleaned chromatogram with minimal influence of noise and baseline artifacts. This represents a significant advance towards providing helpful tools for the automated development of liquid chromatography (LC) methods, and analytical investigations could be shortened when using this type of methodology.
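The source-recovery step that ICA performs can be sketched with scikit-learn's FastICA on synthetic mixed signals. The "peak" and "drift" sources and the mixing matrix below are invented stand-ins, not the paper's HPLC-UV-DAD data or its clustering strategy:

```python
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0.0, 1.0, 500)
# Two independent "source" signals: an elution-like peak and a baseline drift.
peak = np.exp(-((t - 0.5) ** 2) / 0.002)
drift = 0.5 * t
S = np.column_stack([peak, drift])

# Observed traces are linear mixtures of the sources (3 detector channels).
A = np.array([[1.0, 0.4], [0.6, 1.0], [0.8, 0.2]])
X = S @ A.T

# FastICA recovers statistically independent components from the mixtures.
recovered = FastICA(n_components=2, random_state=0).fit_transform(X)
```

The components come back in arbitrary order and scale, which is why the paper follows unmixing with clustering on high-order statistics to tell chemically meaningful components from noise and baseline artifacts.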


Subject(s)
Chromatography, High Pressure Liquid/methods, Complex Mixtures/analysis, Signal Processing, Computer-Assisted, Automation, Chromatography, High Pressure Liquid/instrumentation, Cluster Analysis, Principal Component Analysis/methods
15.
Br J Nutr ; 102(1): 37-53, 2009 Jul.
Article in English | MEDLINE | ID: mdl-19138439

ABSTRACT

The effect of two digestible protein levels (310 and 469 g/kg DM) on the relative lysine (Lys; g Lys/kg DM or g Lys/100 g protein) and the absolute Lys (g Lys intake/kg^0.75 per d) requirements was studied in rainbow trout fry using a dose-response trial. At each protein level, sixteen isoenergetic (22-23 MJ digestible energy/kg DM) diets were tested, covering a full range (2-70 g/kg DM) of sixteen Lys levels. Each diet was given to one group of sixty rainbow trout fry (mean initial body weight 0.78 g) reared at 15 °C for 31 feeding days. The Lys requirements were estimated from the relationships between weight, protein, and Lys gains (g/kg^0.75 per d) and Lys concentration (g/kg DM or g/100 g protein) or Lys intake (g/kg^0.75 per d), using the broken-line model (BLM) and the non-linear four-parameter saturation kinetics model (SKM-4). Both the model and the response criterion chosen markedly affected the relative Lys requirement. The relative Lys requirement for Lys gain of rainbow trout estimated with the BLM (and with the SKM-4 at 90% of the maximum response) increased from 16.8 (19.6) g/kg DM at the low protein level to 23.4 (24.5) g/kg DM at the high protein level. However, the dietary protein content affected neither the absolute Lys requirement, nor the relative Lys requirement expressed as g Lys/100 g protein, nor the Lys requirement for maintenance (21 mg Lys/kg^0.75 per d).
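The broken-line model (BLM) used above fits a response that rises linearly with intake up to a breakpoint and then plateaus; the breakpoint estimates the requirement. A minimal sketch with scipy on synthetic dose-response data (all numbers below are invented, not the study's measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

def broken_line(x, plateau, slope, breakpoint):
    # Response rises linearly with dose below the breakpoint, then plateaus.
    return np.where(x < breakpoint, plateau - slope * (breakpoint - x), plateau)

rng = np.random.default_rng(3)
dose = np.linspace(2.0, 70.0, 16)  # e.g. dietary Lys, g/kg DM (synthetic)
gain = broken_line(dose, 10.0, 0.4, 24.0) + rng.normal(0.0, 0.2, dose.size)

params, _ = curve_fit(broken_line, dose, gain, p0=[8.0, 0.3, 20.0])
plateau, slope, breakpoint = params  # `breakpoint` plays the role of the requirement
```

The study's comparison of BLM against the four-parameter saturation kinetics model (SKM-4) reflects exactly the sensitivity shown here: the estimated requirement depends on which response curve is assumed.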


Subject(s)
Diet, Dietary Proteins/administration & dosage, Lysine/administration & dosage, Oncorhynchus mykiss/growth & development, Oncorhynchus mykiss/metabolism, Animals, Dietary Proteins/metabolism, Digestion, Lysine/metabolism, Models, Biological, Nutritional Requirements, Random Allocation
16.
J Chem Inf Model ; 45(3): 758-67, 2005.
Article in English | MEDLINE | ID: mdl-15921465

ABSTRACT

Combinatorial chemistry is widely used in drug discovery. Once a lead compound has been identified, a series of R-groups and reagents can be selected and combined to generate new potential drugs. The combinatorial nature of this problem leads to chemical libraries that usually contain a very large number of virtual compounds, far too many to permit their chemical synthesis. Therefore, one often wants to select a subset of "good" reagents for each R-group and synthesize all their possible combinations. This raises some difficulties. First, the selection of reagents has to be done such that the compounds of the resulting sublibrary simultaneously optimize a series of chemical properties. For each compound, a desirability index, a concept proposed by Harrington [20], is used to summarize those properties in one fitness value. A loss function is then used as the objective criterion to globally quantify the quality of a sublibrary. Second, there is a huge number of possible sublibraries, and the solution space has to be explored as fast as possible. The WEALD algorithm proposed in this paper starts with a random solution and iterates by applying exchanges, a simple method proposed by Fedorov [13] and often used in the generation of optimal designs. These exchanges are guided by a weighting of the reagents that is adapted recursively as the solution space is explored. The algorithm is applied to a real database and is shown to converge rapidly. It is compared to the results of two other algorithms presented in the combinatorial chemistry literature: the Ultrafast algorithm of D. Agrafiotis and V. Lobanov and the Piccolo algorithm of W. Zheng et al.


Subject(s)
Algorithms, Combinatorial Chemistry Techniques